260 research outputs found

    Bayesian Model Averaging in Vector Autoregressive Processes with an Investigation of Stability of the US Great Ratios and Risk of a Liquidity Trap in the USA, UK and Japan.

    A Bayesian model averaging procedure is presented within the class of vector autoregressive (VAR) processes and applied to two empirical issues. First, stability of the Great Ratios in U.S. macro-economic time series is investigated, together with the presence and effects of permanent shocks. Measures on manifolds are employed in order to elicit uniform priors on subspaces defined by particular structural features of linear VARs. Second, the VAR model is extended to include a smooth transition function in a (monetary) equation and stochastic volatility in the disturbances. The risk of a liquidity trap in the USA, UK and Japan is evaluated, together with the expected cost of a policy adjustment of central banks. Posterior probabilities of different models are evaluated using Markov chain Monte Carlo techniques.

    Evidence on a DSGE Business Cycle model subject to Neutral and Investment-Specific Technology Shocks using Bayesian Model Averaging

    The empirical support for a DSGE type of real business cycle model with two technology shocks is evaluated using a Bayesian model averaging procedure that makes use of a finite mixture of many models within the class of vector autoregressive (VAR) processes. The linear VAR model is extended to permit equilibrium restrictions and restrictions on long-run responses to technology shocks, apart from having a range of lag structures and deterministic processes. These model features are weighted by their posterior probabilities, computed using MCMC and analytical methods. Uncertainty exists as to the most appropriate model for our data, with five models receiving significant support. The model set used has substantial implications for the results obtained. We do find support for a number of features implied by the real business cycle model. Business cycle volatility seems due more to investment-specific technology shocks than to neutral technology shocks, and this result is robust to model specification. These technology shocks appear to account for all stochastic trends in our system after 1984. We provide evidence on the uncertainty bands associated with these results.

    Model Uncertainty and Bayesian Model Averaging in Vector Autoregressive Processes

    Economic forecasts and policy decisions are often informed by empirical analysis based on econometric models. However, when several viable models exist, inference based upon a single model limits the usefulness of the analysis. Taking account of model uncertainty, a Bayesian model averaging procedure is presented which allows for unconditional inference within the class of vector autoregressive (VAR) processes. Several features of the VAR process are investigated. Measures on manifolds are employed in order to elicit uniform priors on subspaces defined by particular structural features of VARs. The features considered are the number and form of the equilibrium economic relations and deterministic processes. Posterior probabilities of these features are used in a model averaging approach for forecasting and impulse response analysis. The methods are applied to investigate stability of the "Great Ratios" in U.S. consumption, investment and income, and the presence and effects of permanent shocks in these series. The results obtained indicate the feasibility of the proposed method.
    Keywords: Posterior probability; Grassmann manifold; Orthogonal group; Cointegration; Model averaging; Stochastic trend; Impulse response; Vector autoregressive model.
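    The core weighting step in such model averaging, combining forecasts from competing models by their posterior model probabilities, can be sketched as follows. This is a minimal Python illustration with hypothetical inputs, not the paper's VAR implementation; the function name and the assumption of equal prior model probabilities are illustrative.

    ```python
    import numpy as np

    def bma_forecast(forecasts, log_marginal_liks):
        """Average per-model forecasts by posterior model probabilities.

        Assumes equal prior model probabilities; `forecasts` is an
        (n_models, horizon) array-like, `log_marginal_liks` one value per model.
        """
        log_ml = np.asarray(log_marginal_liks, dtype=float)
        w = np.exp(log_ml - log_ml.max())  # subtract max for numerical stability
        w /= w.sum()                       # posterior model probabilities
        return w, w @ np.asarray(forecasts, dtype=float)
    ```

    With two equally supported models forecasting [1, 2] and [3, 4], both weights are 0.5 and the averaged forecast is [2, 3].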

    Improper priors with well defined Bayes Factors

    While some improper priors have attractive properties, it is generally claimed that Bartlett’s paradox implies that using improper priors for the parameters in alternative models results in Bayes factors that are not well defined, thus preventing model comparison in this case. In this paper we demonstrate, using well understood principles underlying what is already common practice, that this latter result is not generally true, and so expand the class of priors that may be used for computing posterior odds to two classes of improper priors: the shrinkage prior, and a prior based upon a nesting argument. Using a new representation of the issue of undefined Bayes factors, we develop classes of improper priors from which well defined Bayes factors result. However, as the use of such priors is not free of problems, we include a discussion of the issues with using such priors for model comparison.
    Keywords: Improper prior; Bayes factor; Marginal likelihood; Shrinkage prior; Measure.

    Adaptive Mixture of Student-t Distributions as a Flexible Candidate Distribution for Efficient Simulation: The R Package AdMit

    This paper presents the R package AdMit which provides flexible functions to approximate a certain target distribution and to efficiently generate a sample of random draws from it, given only a kernel of the target density function. The core algorithm consists of the function AdMit which fits an adaptive mixture of Student-t distributions to the density of interest. Then, importance sampling or the independence chain Metropolis-Hastings algorithm is used to obtain quantities of interest for the target density, using the fitted mixture as the importance or candidate density. The estimation procedure is fully automatic and thus avoids the time-consuming and difficult task of tuning a sampling algorithm. The relevance of the package is shown in two examples. The first illustrates in detail the use of the functions provided by the package on a bivariate bimodal distribution. The second shows the relevance of the adaptive mixture procedure through the Bayesian estimation of a mixture-of-ARCH model fitted to foreign exchange log-returns data. The methodology is compared to standard cases of importance sampling and the Metropolis-Hastings algorithm using a naive candidate, and to the Griddy-Gibbs approach.
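    The mixture-of-Student-t importance sampling idea can be illustrated with a stripped-down univariate sketch. The actual AdMit package is written in R, works in multiple dimensions, and fits the mixture adaptively; the Python code, function names, and fixed mixture below are illustrative assumptions only.

    ```python
    import math
    import numpy as np
    from numpy.random import default_rng

    def t_logpdf(x, mu, sigma, df):
        # Log density of a Student-t with location mu, scale sigma, df degrees of freedom.
        z = (x - mu) / sigma
        c = (math.lgamma((df + 1) / 2) - math.lgamma(df / 2)
             - 0.5 * math.log(df * math.pi) - math.log(sigma))
        return c - (df + 1) / 2 * np.log1p(z * z / df)

    def importance_estimate(log_kernel, h, means, scales, probs, df, n, seed=0):
        """Self-normalized importance sampling with a mixture-of-Student-t candidate.

        `log_kernel` is a log target kernel (normalizing constant unknown);
        `h` is the function whose expectation under the target is estimated.
        The mixture (means, scales, probs) is fixed here; AdMit fits it adaptively.
        """
        rng = default_rng(seed)
        comp = rng.choice(len(probs), size=n, p=probs)  # mixture component per draw
        x = np.asarray(means)[comp] + np.asarray(scales)[comp] * rng.standard_t(df, size=n)
        # Log candidate density: log of the mixture of Student-t pdfs.
        log_q = np.logaddexp.reduce(
            [np.log(p) + t_logpdf(x, m, s, df)
             for p, m, s in zip(probs, means, scales)], axis=0)
        log_w = log_kernel(x) - log_q          # log importance weights
        w = np.exp(log_w - log_w.max())        # stabilize before exponentiating
        return np.sum(w * h(x)) / np.sum(w)
    ```

    For a standard normal target kernel, the estimate of the mean converges to 0 as the number of draws grows, provided the candidate's tails dominate the target's.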

    Simulation based Bayesian econometric inference: principles and some recent computational advances

    In this paper we discuss several aspects of simulation based Bayesian econometric inference. We start at an elementary level with basic concepts of Bayesian analysis; evaluating integrals by simulation methods is a crucial ingredient in Bayesian inference. Next, the most popular and well-known simulation techniques are discussed: the Metropolis-Hastings algorithm and Gibbs sampling (the most popular Markov chain Monte Carlo methods), and importance sampling. After that, we discuss two recently developed sampling methods: adaptive radial based direction sampling (ARDS), which makes use of a transformation to radial coordinates, and neural network sampling, which makes use of a neural network approximation to the posterior distribution of interest. Both methods are especially useful in cases where the posterior distribution is not well behaved, in the sense of having highly non-elliptical shapes. The simulation techniques are illustrated in several example models, such as a model for real US GNP and models for binary data of a US recession indicator.
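    One of the baseline techniques mentioned, the Metropolis-Hastings algorithm with a random-walk proposal, can be sketched for a scalar parameter as follows. This is a generic textbook sketch under assumed names, not the paper's ARDS or neural-network samplers, which are considerably more elaborate.

    ```python
    import numpy as np
    from numpy.random import default_rng

    def random_walk_mh(log_post, x0, step, n, seed=0):
        """Random-walk Metropolis-Hastings for a scalar parameter.

        `log_post` is the log posterior kernel; `step` scales the Gaussian proposal.
        """
        rng = default_rng(seed)
        x, lp = x0, log_post(x0)
        draws = np.empty(n)
        for i in range(n):
            prop = x + step * rng.standard_normal()
            lp_prop = log_post(prop)
            # Accept with probability min(1, posterior ratio);
            # compare on the log scale for numerical stability.
            if np.log(rng.uniform()) < lp_prop - lp:
                x, lp = prop, lp_prop
            draws[i] = x
        return draws
    ```

    After discarding an initial burn-in, the retained draws approximate a sample from the posterior; for a Gaussian log posterior centered at 2, their mean settles near 2.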

    Robust Optimization of the Equity Momentum Strategy

    Quadratic optimization for asset portfolios often leads to error maximization, with optimizers zooming in on large errors in the predicted inputs, that is, expected returns and risks. The consequence in most cases is poor real-time performance. In this paper we show how to improve the real-time performance of the popular equity momentum strategy with robust optimization, in an empirical application involving 1500-2500 US stocks over the period 1963-2006. We also show that popular procedures such as Bayes-Stein estimated expected returns, shrinking the covariance matrix, and adding weight constraints fail in such a practical case.

    Classical and Bayesian aspects of robust unit root inference

    This paper has two themes. First, we classify some effects which outliers in the data have on unit root inference. We show that, both in a classical and a Bayesian framework, the presence of additive outliers moves ‘standard’ inference towards stationarity. Second, we base inference on an independent Student-t instead of a Gaussian likelihood. This yields results that are less sensitive to the presence of outliers. Application to several time series with outliers reveals a negative correlation between the unit root and the degrees-of-freedom parameter of the Student-t distribution. Therefore, imposing normality may incorrectly provide evidence against the unit root.
